
# Open-Source MoE

## OpenMoE-Base

License: Apache-2.0

OpenMoE-Base is a Mixture-of-Experts (MoE) base model intended for debugging, trained on only 128 billion tokens. It is part of the OpenMoE project, which aims to advance the open-source MoE community.

Tags: Large Language Model, Transformers
Author: OrionZheng
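
Since the listing tags the model with the Transformers library, a quick generation test can be run with the standard `from_pretrained` workflow. This is a minimal sketch only: the repository ID `OrionZheng/openmoe-base` is an assumption inferred from the author and model names above, and `trust_remote_code=True` is included because custom MoE architectures often require it; neither detail is confirmed by this listing.

```python
# Minimal sketch of loading the checkpoint with Hugging Face Transformers.
# The repository ID below is assumed from the author/model names in the
# listing and may differ from the actual Hub path.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "OrionZheng/openmoe-base"  # hypothetical repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# A short generation, in line with the model's stated debugging purpose.
inputs = tokenizer("The OpenMoE project", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```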